Several recent research reports have helped fuel political battles over charter schools. Opponents of charter schools, in particular, have seized on these reports to argue that charters have failed.
The CREDO Study
A study of student achievement in 16 states was published by Stanford University's Center for Research on Education Outcomes (CREDO) in June 2009. It compared students who transferred to charter schools with a matched group of students who remained in the traditional public schools the transferring students had attended.
The study reported that students in 17% of the charter schools had significantly better gains than those attending the matched traditional public schools. In 37% they did worse, and in 46% the results were not significantly different. However, the report also found wide variation among states, as shown in the table below (states or cities listed in italics were not included in the original report):
| Charters did | Better | About the Same | Worse |
|---|---|---|---|
| State or city | Arkansas, Denver, Chicago, *Indiana*, Louisiana, Missouri, *New York City* | California, District of Columbia, Georgia, North Carolina | Arizona, Florida, Minnesota, New Mexico, Ohio, *Pennsylvania*, Texas |
The study reported poorer charter performance in states with multiple charter authorizers and in states that cap charter school growth. Charter schools did better with low-income students and English language learners.
The What Works Clearinghouse recently issued a Quick Review of the CREDO report on Indiana charter schools, saying that the "report meets WWC evidence standards with reservations." The reservations stem from a concern that unobserved differences (such as in motivation) might have influenced the results. Presumably the same rating would apply to the CREDO studies of other states, since the same methodology was used.
The Mathematica Study
Mathematica Policy Research conducted a study for the Institute of Education Sciences at the U.S. Department of Education in 36 charter middle schools with admission lotteries in 15 states. It identified 2,330 students who applied to these schools and compared the outcomes of those randomly admitted (lottery winners) with the outcomes of those not admitted (lottery losers). Both sets of students were tracked over two years. The What Works Clearinghouse described this study as "a well-implemented randomized controlled trial."
The Mathematica study found that, on average, charter middle schools were neither more nor less successful than traditional public schools in improving math or reading test scores. As with the CREDO study described earlier, it found that charter schools' impact on student achievement varied significantly from one school to another.
The study found that charter middle schools in urban areas, as well as those serving higher proportions of low-income and low-achieving students, were more effective than other charter schools at improving math test scores relative to their nearby traditional public schools.
Other charter school features with positive effects on achievement included smaller enrollments and the use of ability grouping in math and English classes.
Unlike the CREDO study, this report found no significant relationship between achievement impacts and the charter schools' policy environment. However, it appears the two studies defined policy environment somewhat differently.
Common Results
Despite their very different designs, both studies arrived at similar conclusions in several areas. Both found that achievement differences between individual schools, whether charter or traditional public, far exceeded the average difference between the two groups of schools.
A second common finding is that charter schools do better on average with low-income students. It would be useful to know more about how the practices of charter schools serving low-income students differ from those of charter schools serving middle-class students.
With increasing attempts by charter authorizers to hold charter schools accountable for results, it would be useful to know whether the schools whose students did better had higher scores on the measures authorizers use, such as those recommended by the National Association of Charter School Authorizers. Neither study tried to examine this relationship, although the author of the CREDO study responded to an inquiry saying she hoped to get funding to do so.
Other Studies
Most of the other studies comparing the subsequent achievement of students accepted and not accepted into charter schools are limited to one or two states (for a discussion of two studies of New York City charter schools, click here). Given the wide variation in results by state, it seems risky to generalize from studies of schools in a particular state or city to conclusions about the relative effectiveness of charter schools as a whole.
Another possible approach to comparing charter schools to traditional public schools is to use the National Assessment of Educational Progress (NAEP) mathematics and reading tests. Click here for a discussion of how to use the NAEP data tool to make such comparisons. Probably the greatest limitation of using NAEP scores to compare schools is that they provide no information on student growth.